The Linear Quantization Strategy of Quadratic Hebbian-Type Associative Memories and Their Performance Analysis

Authors

  • Chao-Hui Ko
  • Ching-Tsorng Tsai
  • Chishyan Liaw
Abstract

Quadratic Hebbian-type associative memories outperform first-order Hebbian-type associative memories, but their large interconnection values make them harder to implement in chips. To reduce the interconnection values of a network storing M patterns, the interconnection range [−M, M] is mapped linearly onto [−H, H], where H is the quantization level. An equation for the probability of direct convergence of the quantized quadratic Hebbian-type associative memory is derived, and its performance is explored. Experiments demonstrate that the quantized network approaches the recall capacity of the original network at a small quantization level. Since quadratic Hebbian-type associative memories typically store more patterns, the linear quantization strategy reduces the interconnection values more efficiently.
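The quantization step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes bipolar (±1) patterns, second-order Hebbian interconnections T[i, j, k] built as sums of triple products over the M stored patterns (so each value lies in [−M, M]), and a simple rounded linear map onto the integer range [−H, H]. The function names are hypothetical.

```python
import numpy as np

def quadratic_hebbian_weights(patterns):
    """Second-order (quadratic) Hebbian interconnections:
    T[i, j, k] = sum over stored patterns m of x^m_i * x^m_j * x^m_k.
    With M bipolar patterns, each entry lies in [-M, M]."""
    X = np.asarray(patterns)                      # shape (M, n)
    return np.einsum('mi,mj,mk->ijk', X, X, X)

def quantize(T, M, H):
    """Linearly map interconnection values from [-M, M] to integers
    in [-H, H], where H is the quantization level."""
    return np.rint(T * (H / M)).astype(int)

# Example: M = 3 stored bipolar patterns of length n = 4.
patterns = np.array([[ 1, -1,  1, -1],
                     [ 1,  1, -1, -1],
                     [-1,  1,  1, -1]])
T  = quadratic_hebbian_weights(patterns)          # entries in [-3, 3]
Tq = quantize(T, M=len(patterns), H=1)            # entries in [-1, 1]
```

A recall step for such a second-order network computes s_i = Σ_jk T[i, j, k] x_j x_k and takes the sign, e.g. `np.sign(np.einsum('ijk,j,k->i', T, x, x))`; the paper's claim is that recall with the quantized `Tq` approaches that of `T` even for small H.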


Similar Articles

A neural network with a single recurrent unit for associative memories based on linear optimization

Recently, some continuous-time recurrent neural networks have been proposed for associative memories based on optimizing linear or quadratic programming problems. In this paper, a simple and efficient neural network with a single recurrent unit is proposed for realizing associative memories. Compared with the existing neural networks for associative memories, the main advantage of the proposed ...


Robust Associative Memories Naturally Occuring From Recurrent Hebbian Networks Under Noise

The brain is a noisy system subject to energy constraints. These facts are rarely taken into account when modelling artificial neural networks. In this paper, we are interested in demonstrating that those factors can actually lead to the appearance of robust associative memories. We first propose a simplified model of noise in the brain, taking into account synaptic noise and interference from ...


Convergence results in an associative memory model

This paper presents rigorous mathematical proofs for some observed convergence phenomena in an associative memory model introduced by Hopfield (based on Hebbian rules) for storing a number of random n-bit patterns. The capability of the model to correct a linear number of random errors in a bit pattern has been established earlier, but the existence of a large domain of attraction (correcti...


Word Recognition and Learning based on Associative Memories and Hidden Markov Models

A word recognition architecture based on a network of neural associative memories and hidden Markov models has been developed. The input stream, composed of subword units like word-internal triphones consisting of diphones and triphones, is provided to the network of neural associative memories by hidden Markov models. The word recognition network derives words from this input stream. The archit...


High Performance Associative Memories and Structured Weight Dilution

The consequences of two techniques for symmetrically diluting the weights of the standard Hopfield architecture associative memory model, trained using a non-Hebbian learning rule, are examined. This paper reports experimental investigations into the effect of dilution on factors such as: pattern stability and attractor performance. It is concluded that these networks maintain a reasonable leve...



Journal:
  • J. Inf. Sci. Eng.

Volume 27  Issue 

Pages  -

Publication date: 2011